Determining the effective dimensionality of the genetic variance-covariance matrix.

Authors

  • Emma Hine
  • Mark W Blows
Abstract

Determining the dimensionality of G provides an important perspective on the genetic basis of a multivariate suite of traits. Since the introduction of Fisher's geometric model, the number of genetically independent traits underlying a set of functionally related phenotypic traits has been recognized as an important factor influencing the response to selection. Here, we show how the effective dimensionality of G can be established, using a method for the determination of the dimensionality of the effect space from a multivariate general linear model introduced by Amemiya (1985). We compare this approach with two other available methods, factor-analytic modeling and bootstrapping, using a half-sib experiment that estimated G for eight cuticular hydrocarbons of Drosophila serrata. In our example, eight pheromone traits were shown to be adequately represented by only two underlying genetic dimensions by Amemiya's approach and factor-analytic modeling of the covariance structure at the sire level. In contrast, bootstrapping identified four dimensions with significant genetic variance. A simulation study indicated that while the performance of Amemiya's method was more sensitive to power constraints, it performed as well or better than factor-analytic modeling in correctly identifying the original genetic dimensions at moderate to high levels of heritability. The bootstrap approach consistently overestimated the number of dimensions in all cases and performed less well than Amemiya's method at subspace recovery.
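The notion of effective dimensionality can be illustrated with a simple eigenanalysis: the number of non-negligible eigenvalues of an estimated G indicates how many genetically independent trait combinations actually carry variance. A minimal sketch in Python/numpy, constructing a hypothetical rank-2 G for eight traits to mimic the eight-pheromone example; this illustrates the concept only, not Amemiya's test or the factor-analytic models used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Construct a "true" 8-trait G with only 2 underlying genetic dimensions
# (rank 2), mimicking the eight-pheromone example in the abstract.
B = rng.normal(size=(8, 2))   # hypothetical loadings on 2 genetic factors
G = B @ B.T                   # 8x8 genetic covariance matrix of rank 2

# Eigenvalues measure genetic variance along each orthogonal
# combination of the original traits.
eigvals = np.linalg.eigvalsh(G)[::-1]   # sorted, largest first

# Effective dimensionality: count eigenvalues that are non-negligible
# relative to the total genetic variance.
effective_dim = int(np.sum(eigvals / eigvals.sum() > 1e-10))
print(effective_dim)   # → 2 for this rank-2 construction
```

In practice G is estimated with error (e.g., from a half-sib design), so the sample eigenvalues are all nonzero and a formal procedure such as Amemiya's method or factor-analytic modeling is needed to decide which dimensions are statistically supported.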


Similar resources

Compression of Breast Cancer Images By Principal Component Analysis

The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of the eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the directions of maximum variance of X in R^N carry the most relevant information about X. These eigenvectors are called principal components [8]. Ass...

Full text
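The idea in the snippet above, representing X by the eigenvectors of its covariance matrix and keeping the directions of maximum variance, can be sketched directly. A minimal Python/numpy illustration with a toy dataset (variable names and dimensions are illustrative assumptions, not from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset X: 200 samples in R^5 whose variance lies mostly
# in 2 directions, plus a little isotropic noise.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) \
    + 0.01 * rng.normal(size=(200, 5))

Xc = X - X.mean(axis=0)               # center the data
C = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # eigenvalues in ascending order

# Principal components: eigenvectors with the largest eigenvalues.
order = np.argsort(eigvals)[::-1]
k = 2
W = eigvecs[:, order[:k]]             # top-k principal components

Z = Xc @ W                            # compressed representation (n x k)
X_hat = Z @ W.T + X.mean(axis=0)      # reconstruction from k components
```

Because nearly all of the variance lies in the top two components, the reconstruction X_hat from the k = 2 compressed scores is close to X; this is the compression the snippet describes.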


A phylogenetic approach to determining the importance of constraint on phenotypic evolution in the neotropical lizard Anolis cristatellus

Question: Is the pattern of phenotypic divergence among populations influenced by constraint in the form of the genetic covariances among characters? Background: Quantitative genetic theory predicts that when evolutionary lineages diverge simultaneously by genetic drift, the pattern of among-population divergence will parallel the pattern of within-population genetic variation and covariation. ...

Full text

Recent Results about the Largest Eigenvalue of Random Covariance Matrices and Statistical Application

Sample covariance matrices are a fundamental tool of multivariate statistics. After data collection, we get an n×p data matrix X. We will call n the number of observations and p the number of predictors. The rows of X are assumed to be realizations of a random variable whose covariance structure is Σ_p. For practical applications, one often wishes to estimate Σ_p in order to understand the depen...

Full text
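The largest eigenvalue of a sample covariance matrix, the object of the snippet above, is straightforward to compute; a small Python/numpy sketch (the sample sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 500, 10                       # n observations, p predictors

# Rows of X are draws from a population with identity covariance
# (Sigma_p = I), so every true eigenvalue equals 1.
X = rng.normal(size=(n, p))

S = np.cov(X, rowvar=False)          # p x p sample covariance estimate
lam_max = np.linalg.eigvalsh(S)[-1]  # largest sample eigenvalue

# Even though Sigma_p = I, lam_max exceeds 1: sampling noise inflates
# the top of the spectrum, which is the phenomenon that random-matrix
# results for the largest eigenvalue quantify.
```

This upward bias of the leading sample eigenvalue is also why naive eigenanalysis of an estimated G tends to overstate its dimensionality, motivating the formal tests compared in the main article.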

Small Sample Size in High Dimensional Space - Minimum Distance Based Classification

In this paper we present some new results concerning classification in the small-sample, high-dimensional case. We discuss geometric properties of data structures in high dimensions. It is known that in high dimensions such data form an almost regular simplex, even if the covariance structure of the data is not the identity. We restrict our attention to two-class discrimination problems. It is assumed that ob...

Full text
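A minimum-distance (nearest class mean) classifier of the kind named in the title can be sketched as follows in Python/numpy; the two-class setup, dimensions, and class separation are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
p, n_per_class = 100, 10            # high dimension, small sample per class

# Two classes with separated means; far fewer samples than dimensions.
mu0, mu1 = np.zeros(p), np.full(p, 0.5)
X0 = mu0 + rng.normal(size=(n_per_class, p))
X1 = mu1 + rng.normal(size=(n_per_class, p))

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)   # class means from training data

def classify(x):
    # Assign to the class whose mean is nearer in Euclidean distance.
    return 0 if np.linalg.norm(x - m0) <= np.linalg.norm(x - m1) else 1

# Classify 100 fresh points drawn from class 1.
X_test = mu1 + rng.normal(size=(100, p))
preds = np.array([classify(x) for x in X_test])
accuracy = (preds == 1).mean()      # fraction assigned to the correct class
```

Despite n_per_class being a tenth of the dimension, the nearest-mean rule classifies well here because the distance between the class means grows with √p while the noise around each estimated mean stays comparatively controlled.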


Journal:
  • Genetics

Volume 173, Issue 2

Pages: -

Publication year: 2006